Consistent selection via the Lasso
Authors
Abstract
In this article we investigate consistency of selection in regression models via the popular Lasso method. Here we depart from the traditional linear regression assumption and consider approximations of the regression function f by elements of a given dictionary of M functions. The target for consistency is the index set of those functions from this dictionary that realize the most parsimonious approximation to f among all linear combinations belonging to an L2 ball centered at f and of radius r_{n,M}. In this framework we show that a consistent estimate of this index set can be derived via ℓ1 penalized least squares, with a data-dependent penalty and with tuning sequence r_{n,M} > √(log(Mn)/n), where n is the sample size. Our results hold for any 1 ≤ M ≤ n^γ, for any γ > 0.
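As a concrete illustration of the selection rule described above, the sketch below runs ℓ1 penalized least squares over a small synthetic dictionary, with the penalty scaled like √(log(Mn)/n). The cosine dictionary, the constant c in front of the rate, and the zero threshold used to read off the selected index set are illustrative assumptions; they stand in for, and are not, the paper's data-dependent penalty.

# A minimal sketch, not the authors' exact estimator: l1-penalized least
# squares over a dictionary of M functions, with the tuning parameter
# proportional to sqrt(log(M * n) / n) as in the abstract.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, M = 200, 50

# Hypothetical dictionary: cosine basis functions evaluated at the design points.
x = rng.uniform(-1.0, 1.0, size=n)
X = np.column_stack([np.cos(j * np.pi * x) for j in range(1, M + 1)])

# Regression function f; only the first two dictionary elements matter here.
f = 2.0 * np.cos(np.pi * x) - 1.5 * np.cos(2.0 * np.pi * x)
y = f + 0.5 * rng.standard_normal(n)

# Tuning sequence r_{n,M} proportional to sqrt(log(M n) / n); c is a guess.
c = 0.5
r_nM = c * np.sqrt(np.log(M * n) / n)

fit = Lasso(alpha=r_nM, fit_intercept=False, max_iter=50_000).fit(X, y)
selected = np.flatnonzero(np.abs(fit.coef_) > 1e-8)
print("estimated index set (0-based):", selected)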
Similar Articles
Autoregressive process modeling via the Lasso procedure
The Lasso is a popular model selection and estimation procedure for linear models that enjoys nice theoretical properties. In this paper, we study the Lasso estimator for fitting autoregressive time series models. We adopt a double asymptotic framework where the maximal lag may increase with the sample size. We derive theoretical results establishing various types of consistency. In particular,...
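The following rough sketch shows one way to fit an autoregressive model with the Lasso: regress the series on its first p lags and keep the lags with nonzero coefficients. The simulated AR(2) series, the maximal lag p, and the value of alpha are assumptions made for illustration, not the paper's procedure.

# Illustrative only: Lasso on a lagged design matrix built from a simulated series.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n, p = 500, 20  # sample size and maximal lag (the lag may grow with n)

# Simulate an AR(2) process: y_t = 0.5 y_{t-1} - 0.3 y_{t-2} + eps_t.
y = np.zeros(n + p)
for t in range(2, n + p):
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

# Lagged design matrix: column j holds y_{t-j-1}.
X = np.column_stack([y[p - j - 1 : n + p - j - 1] for j in range(p)])
target = y[p:]

fit = Lasso(alpha=0.05, fit_intercept=False, max_iter=50_000).fit(X, target)
print("selected lags:", 1 + np.flatnonzero(np.abs(fit.coef_) > 1e-8))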
Some Two-Step Procedures for Variable Selection in High-Dimensional Linear Regression
We study the problem of high-dimensional variable selection via some two-step procedures. First we show that given some good initial estimator which is l∞-consistent but not necessarily variable selection consistent, we can apply the nonnegative Garrote, adaptive Lasso or hard-thresholding procedure to obtain a final estimator that is both estimation and variable selection consistent. Unlike th...
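To make the two-step idea concrete, here is a hedged sketch of the hard-thresholding variant: an initial estimate (an ordinary Lasso fit, chosen purely for illustration) is followed by thresholding of its coefficients to obtain the final support. The tuning value alpha and the threshold tau are assumptions, not the paper's choices.

# Step 1: some initial estimator; Step 2: hard-threshold it to select variables.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p = 150, 300
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:4] = [3.0, -2.0, 1.5, 1.0]
y = X @ beta + rng.standard_normal(n)

# Step 1: initial estimator (needs only to be close in sup-norm).
initial = Lasso(alpha=0.1, max_iter=50_000).fit(X, y).coef_

# Step 2: keep only coefficients whose magnitude exceeds the threshold.
tau = 0.5
support = np.flatnonzero(np.abs(initial) > tau)
print("selected variables:", support)

The thresholding step is what cleans up the false positives that a loosely tuned initial estimator may admit, which is the point of separating estimation from selection.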
Model selection via standard error adjusted adaptive lasso
The adaptive lasso is a model selection method shown to be both consistent in variable selection and asymptotically normal in coefficient estimation. The actual variable selection performance of the adaptive lasso depends on the weight used. It turns out that the weight assignment using the OLS estimate (OLS-adaptive lasso) can result in very poor performance when collinearity of the model matr...
Path consistent model selection in additive risk model via Lasso.
As a flexible alternative to the Cox model, the additive risk model assumes that the hazard function is the sum of the baseline hazard and a regression function of covariates. For right censored survival data when variable selection is needed along with model estimation, we propose a path consistent model selector using a modified Lasso approach, under the additive risk model assumption. We sho...
The Adaptive Lasso and Its Oracle Properties
The lasso is a popular technique for simultaneous estimation and variable selection. Lasso variable selection has been shown to be consistent under certain conditions. In this work we derive a necessary condition for the lasso variable selection to be consistent. Consequently, there exist certain scenarios where the lasso is inconsistent for variable selection. We then propose a new version of ...
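The weighted penalty behind the adaptive lasso can be sketched as follows: an initial estimate supplies weights w_j = 1/|b_j|^γ, and the weighted ℓ1 problem is solved by rescaling the design columns and running an ordinary Lasso. The ridge initial estimator, γ, and alpha below are illustrative choices, not the paper's prescription.

# Adaptive-lasso-style sketch via column rescaling (weighted l1 == Lasso on X / w).
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(3)
n, p = 200, 30
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.standard_normal(n)

# Step 1: initial estimate used to build the weights w_j = 1 / |b_j|^gamma.
gamma = 1.0
b_init = Ridge(alpha=1.0).fit(X, y).coef_
w = 1.0 / (np.abs(b_init) ** gamma + 1e-8)

# Step 2: ordinary Lasso on the column-rescaled design, then map back.
X_w = X / w                      # divides column j by w_j
fit = Lasso(alpha=0.1, max_iter=50_000).fit(X_w, y)
coef = fit.coef_ / w             # coefficients on the original scale
print("selected variables:", np.flatnonzero(np.abs(coef) > 1e-8))

Variables with small initial estimates receive large weights and are penalized heavily, while strong signals are penalized lightly, which is what drives the oracle behaviour described in the snippet.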
Consistent group selection in high-dimensional linear regression.
In regression problems where covariates can be naturally grouped, the group Lasso is an attractive method for variable selection since it respects the grouping structure in the data. We study the selection and estimation properties of the group Lasso in high-dimensional settings when the number of groups exceeds the sample size. We provide sufficient conditions under which the group Lasso selec...
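A toy group Lasso sketch is given below: the penalty sums the Euclidean norms of coefficient blocks, so entire groups enter or leave the model together. The group structure, the penalty level, and the simple proximal-gradient solver are assumptions for illustration only, not the estimator studied in the paper.

# Proximal-gradient (ISTA-style) sketch of the group Lasso penalty.
import numpy as np

rng = np.random.default_rng(4)
n, n_groups, g_size = 120, 10, 5
p = n_groups * g_size
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:g_size] = 1.5          # only the first group is active
y = X @ beta + rng.standard_normal(n)

groups = [np.arange(g * g_size, (g + 1) * g_size) for g in range(n_groups)]
lam = 0.3 * n                # group-lasso penalty level (assumed)
step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient

b = np.zeros(p)
for _ in range(2000):
    grad = X.T @ (X @ b - y)             # gradient of 0.5 * ||y - Xb||^2
    z = b - step * grad
    for idx in groups:                   # group-wise soft-thresholding (prox step)
        norm = np.linalg.norm(z[idx])
        z[idx] = 0.0 if norm <= step * lam else (1 - step * lam / norm) * z[idx]
    b = z

selected_groups = [g for g, idx in enumerate(groups) if np.linalg.norm(b[idx]) > 1e-8]
print("selected groups:", selected_groups)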